

Creators/Authors contains: "Messenger, Daniel_A"

Abstract

In this work we study the asymptotic consistency of the weak-form sparse identification of nonlinear dynamics algorithm (WSINDy) in the identification of differential equations from noisy samples of solutions. We prove that the WSINDy estimator is unconditionally asymptotically consistent for a wide class of models that includes the Navier–Stokes, Kuramoto–Sivashinsky and Sine–Gordon equations. We thus provide a mathematically rigorous explanation for the observed robustness to noise of weak-form equation learning. Conversely, we also show that, in general, the WSINDy estimator is only conditionally asymptotically consistent, yielding discovery of spurious terms with probability one if the noise level exceeds a critical threshold $$\sigma_c$$. We provide explicit bounds on $$\sigma_c$$ in the case of Gaussian white noise, and we explicitly characterize the spurious terms that arise in the case of trigonometric and/or polynomial libraries. Furthermore, we show that, if the data are suitably denoised (a simple moving average filter is sufficient), then asymptotic consistency is recovered for models with locally Lipschitz, polynomial-growth nonlinearities. Our results reveal important aspects of weak-form equation learning, which may be used to improve future algorithms. We demonstrate our findings numerically using the Lorenz system, the cubic oscillator, a viscous Burgers-growth model and a Kuramoto–Sivashinsky-type high-order PDE.
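To make the weak-form idea concrete, the sketch below runs a WSINDy-style regression on a toy scalar ODE, du/dt = u − u³, with library {u, u², u³}. It is a minimal illustration only: the test-function shape, window placement, noise level, and threshold are assumptions chosen for this example, not the exact setup analyzed in the paper. The key step is integration by parts against a compactly supported test function, so derivatives of the noisy data are never computed directly.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.005
t = np.arange(0.0, 6.0, dt)

# Exact solution of du/dt = u - u^3 with u(0) = 0.1, plus additive noise.
u0 = 0.1
u = u0 * np.exp(t) / np.sqrt(1.0 + u0**2 * (np.exp(2 * t) - 1.0))
u_noisy = u + 0.001 * rng.standard_normal(t.size)

# Compactly supported test function phi(s) = (1 - s^2)^p on [-1, 1],
# mapped onto windows of half-width m samples. phi vanishes at the window
# edges, so integration by parts produces no boundary terms.
p, m = 7, 100
s = np.linspace(-1.0, 1.0, 2 * m + 1)
phi = (1.0 - s**2) ** p
dphi_dt = -2.0 * p * s * (1.0 - s**2) ** (p - 1) / (m * dt)  # chain rule ds/dt

centers = np.arange(m, t.size - m, 50)
library = [u_noisy, u_noisy**2, u_noisy**3]

# Weak form: ∫ phi u' dt = -∫ phi' u dt, so each window gives one row of G w = b.
G = np.zeros((centers.size, len(library)))
b = np.zeros(centers.size)
for k, c in enumerate(centers):
    win = slice(c - m, c + m + 1)
    b[k] = -dt * np.sum(dphi_dt * u_noisy[win])
    for j, f in enumerate(library):
        G[k, j] = dt * np.sum(phi * f[win])

# Sequentially thresholded least squares to promote a sparse coefficient vector.
w = np.linalg.lstsq(G, b, rcond=None)[0]
for _ in range(5):
    small = np.abs(w) < 0.1
    w[small] = 0.0
    if (~small).any():
        w[~small] = np.linalg.lstsq(G[:, ~small], b, rcond=None)[0]

print(np.round(w, 3))  # should be close to [1, 0, -1], i.e. du/dt ≈ u - u^3
```

Because the noise is integrated against a smooth, compactly supported test function, it largely averages out within each window; this is the mechanism behind the noise robustness that the abstract refers to, and the critical-noise-level phenomenon appears when this averaging is no longer enough to keep spurious library terms below the threshold.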